14 research outputs found

    Interval-valued and intuitionistic fuzzy mathematical morphologies as special cases of L-fuzzy mathematical morphology

    Mathematical morphology (MM) offers a wide range of tools for image processing and computer vision. MM was originally conceived for the processing of binary images and later extended to gray-scale morphology. Extensions of classical binary morphology to gray-scale morphology include approaches based on fuzzy set theory that give rise to fuzzy mathematical morphology (FMM). From a mathematical point of view, FMM relies on the fact that the class of all fuzzy sets over a certain universe forms a complete lattice. Recall that complete lattices provide the most general framework in which MM can be conducted. The concept of L-fuzzy set generalizes not only the concept of fuzzy set but also the concepts of interval-valued fuzzy set and Atanassov's intuitionistic fuzzy set. In addition, the class of L-fuzzy sets forms a complete lattice whenever the underlying set L constitutes a complete lattice. Based on these observations, we develop a general approach towards L-fuzzy mathematical morphology in this paper. In particular, we focus on the construction of connectives for interval-valued and intuitionistic fuzzy mathematical morphologies that arise as special, isomorphic cases of L-fuzzy MM. As an application of these ideas, we generate a combination of some well-known medical image reconstruction techniques in terms of interval-valued fuzzy image processing.
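The basic FMM operators mentioned above can be sketched concretely. The following is a minimal illustration (not the paper's L-fuzzy construction), assuming the minimum t-norm for dilation and its Gödel residual implication for erosion, on 1-D [0,1]-valued signals with zero/one padding at the borders; the function names and the 1-D setting are illustrative choices.

```python
import numpy as np

def goedel_implication(a, b):
    # I(a, b) = 1 if a <= b else b  (residual implication of the minimum t-norm)
    return np.where(a <= b, 1.0, b)

def fuzzy_dilation(image, se):
    # delta(A)(x) = sup_j min(A(x + j - c), B(j)), zero-padded at the borders
    n, m = len(image), len(se)
    c = m // 2
    padded = np.pad(image, c, constant_values=0.0)
    return np.array([np.max(np.minimum(padded[i:i + m], se)) for i in range(n)])

def fuzzy_erosion(image, se):
    # eps(A)(x) = inf_j I(B(j), A(x + j - c)), one-padded at the borders
    n, m = len(image), len(se)
    c = m // 2
    padded = np.pad(image, c, constant_values=1.0)
    return np.array([np.min(goedel_implication(se, padded[i:i + m]))
                     for i in range(n)])

signal = np.array([0.0, 0.2, 0.9, 0.4, 0.1])
se = np.array([0.5, 1.0, 0.5])   # fuzzy structuring element with full center
dil = fuzzy_dilation(signal, se)
ero = fuzzy_erosion(signal, se)
```

Because the structuring element attains membership 1 at its center, dilation is extensive (pointwise above the signal) and erosion is anti-extensive, as lattice theory predicts for an adjoint pair.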

    Fractional derivatives as weighted average of historical values: an application to COVID-19 in Brazil

    The memory effect is an interesting feature that can be seen in fractional differential equations. To show this clearly, in this paper we prove that the Caputo derivative of a function f, as well as the Riemann-Liouville integral and derivative, is proportional to a weighted average of the historical values of f or f'. For this, we use the statistical expectation of functions whose random variable follows a beta distribution. Moreover, through the respective probability density functions, for each operator we specify the weight that the historical values of the function receive in determining its current value, according to the fractional order of the derivative. Furthermore, to demonstrate the effectiveness of the memory effect in describing real phenomena, we compare a classical model with its fractional version to model COVID-19 in Brazil.
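The weighted-average claim can be checked numerically. Substituting s = t·u in the Caputo integral of order α ∈ (0,1) gives D^α f(t) = t^(1-α)/Γ(2-α) · E[f'(tU)] with U ~ Beta(1, 1-α). The sketch below (not the paper's code; α, t, and the test function are assumed values) verifies this identity by Monte Carlo for f(t) = t², whose Caputo derivative has the closed form Γ(3)/Γ(3-α) · t^(2-α).

```python
import math
import numpy as np

alpha, t = 0.5, 2.0
rng = np.random.default_rng(0)

# Monte Carlo estimate of E[f'(t*U)] with U ~ Beta(1, 1 - alpha) and f'(s) = 2s
u = rng.beta(1.0, 1.0 - alpha, size=400_000)
expectation = np.mean(2.0 * t * u)

# Caputo derivative as a weighted average of historical values of f'
caputo_mc = t ** (1.0 - alpha) / math.gamma(2.0 - alpha) * expectation

# Closed form: Caputo D^alpha t^2 = Gamma(3)/Gamma(3 - alpha) * t^(2 - alpha)
caputo_exact = math.gamma(3.0) / math.gamma(3.0 - alpha) * t ** (2.0 - alpha)

print(caputo_mc, caputo_exact)
```

Note the interpretation: for α close to 1, Beta(1, 1-α) concentrates mass near u = 1, so recent values of f' dominate and the classical derivative is recovered in the limit.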

    Modified models of morphological neural networks

    No full text
    Advisor: Peter Sussner. Master's dissertation in Applied Mathematics, Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica.
    Abstract: Morphological neural networks (MNNs) are artificial neural networks whose hidden neurons perform elementary operations of mathematical morphology (MM). Several particular models of MNNs have been proposed in recent years, including morphological perceptrons (MPs), morphological perceptrons with dendrites, (fuzzy) morphological associative memories, modular morphological neural networks, as well as morphological shared-weight and regularization neural networks. Applications of MNNs include pattern recognition, time series prediction, target detection, self-localization, and hyperspectral image processing. In this thesis, we present two new models of morphological neural networks. The first one consists of a fuzzy associative memory called the KS-FAM. The second one represents a novel version of the morphological perceptron for classification problems with multiple classes, called the morphological perceptron with competitive learning (MP/CL). For both the KS-FAM and the MP/CL, we investigate and prove several properties. In particular, we characterize the conditions for perfect recall using the KS-FAM, as well as the outputs produced upon presentation of an arbitrary input pattern. In addition, we prove that the learning algorithm of the MP/CL converges in a finite number of steps and that the network produced after the conclusion of the training phase does not depend on the order in which the training patterns are presented. Moreover, the resulting MP/CL is guaranteed to perfectly classify all training data without generating any regions of indecision.
    Finally, we compare the performances of our new models with those of a range of competing models in a series of experiments, including gray-scale image recognition (in the case of the KS-FAM) and classification of several well-known datasets available on the internet (in the case of the MP/CL).

    Θ-FAMs: fuzzy associative memories based on Θ-functions

    No full text
    Advisor: Peter Sussner. Doctoral thesis in Applied Mathematics, Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica.
    Abstract: Most fuzzy associative memories (FAMs) in the literature correspond to neural networks with a single layer of weights that distributively contains the information about the associations to be stored. The main applications of these types of associative memory can be found in fuzzy rule-based systems. In contrast, we present in this thesis the class of Θ-fuzzy associative memories (Θ-FAMs), which represent fuzzy neural networks with two layers.
    Particular cases of Θ-FAMs, called (dual) S-FAMs and E-FAMs, are based on fuzzy subsethood and equivalence measures. We provide theoretical results concerning the storage capability and error correction capability of Θ-FAMs. Furthermore, we introduce a general training algorithm for Θ-FAMs that is guaranteed to converge in a finite number of iterations. We also propose an alternative training algorithm for a certain type of E-FAM that not only adjusts the parameters of the corresponding network but also automatically determines its topology. We compare the classification rates produced by Θ-FAMs with those of some well-known classifiers on several benchmark classification problems available on the internet. Finally, we successfully apply the Θ-FAM approach to a problem of vision-based self-localization in mobile robotics.

    Morphological perceptrons with competitive learning: Lattice-theoretical framework and constructive learning algorithm

    No full text
    A morphological neural network is generally defined as a type of artificial neural network that performs an elementary operation of mathematical morphology at every node, possibly followed by the application of an activation function. The underlying framework of mathematical morphology can be found in lattice theory. With the advent of granular computing, lattice-based neurocomputing models such as morphological neural networks and fuzzy lattice neurocomputing models are becoming increasingly important since many information granules such as fuzzy sets and their extensions, intervals, and rough sets are lattice ordered. In this paper, we present the lattice-theoretical background and the learning algorithms for morphological perceptrons with competitive learning which arise by incorporating a winner-take-all output layer into the original morphological perceptron model. Several well-known classification problems that are available on the internet are used to compare our new model with a range of classifiers such as conventional multi-layer perceptrons, fuzzy lattice neurocomputing models, k-nearest neighbors, and decision trees. (C) 2010 Elsevier Inc. All rights reserved.
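The combination of morphological neurons with a winner-take-all layer can be sketched as follows. This is a toy illustration, not the paper's constructive learning algorithm: each class is represented by one erosion-like (min-plus) neuron pair whose score equals the negative Chebyshev distance to a stored prototype, and the winner-take-all layer picks the class with the largest score. The prototypes and test points are invented for the demo.

```python
import numpy as np

def minplus_erosion(x, w):
    # Erosion-like morphological neuron: a min of sums replaces the dot product
    return np.min(x + w)

def score(x, p):
    # min_j min(x_j - p_j, p_j - x_j) = -(Chebyshev distance from x to p)
    return min(minplus_erosion(x, -p), minplus_erosion(-x, p))

def classify(x, prototypes):
    # Winner-take-all output layer: the class whose neuron responds most wins
    return int(np.argmax([score(x, p) for p in prototypes]))

prototypes = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
label_a = classify(np.array([0.4, -0.2]), prototypes)  # near prototype 0
label_b = classify(np.array([4.8, 5.3]), prototypes)   # near prototype 1
print(label_a, label_b)
```

The point of the lattice formulation is that the neuron's nonlinearity is built into the max/min algebra itself, so no smooth activation function is needed.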

    Tunable equivalence fuzzy associative memories

    No full text
    This paper introduces a new class of fuzzy associative memories (FAMs) called tunable equivalence fuzzy associative memories, for short tunable E-FAMs or TE-FAMs, that are determined by the application of parametrized equivalence measures in the hidden nodes. Tunable E-FAMs belong to the class of Theta-FAMs that have recently appeared in the literature. In contrast to previous Theta-FAM models, tunable E-FAMs allow for the extraction of a fundamental memory set from the training data by means of an algorithm that depends on the evaluation of equivalence measures. Furthermore, we are able to optimize not only the weights corresponding to the contributions of the hidden nodes but also the contributions of the attributes of the data by tuning the parametrized equivalence measures used in a TE-FAM model. The computational effort involved in training tunable TE-FAMs is very low compared to the one of the previous Theta-FAM training algorithm. (C) 2015 Elsevier B.V. All rights reserved.
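A hidden node of this kind can be sketched in a few lines. The specific parametrized measure E_p(a, b) = 1 - |a - b|^p, the attribute weights, and the toy memories below are assumptions for illustration only, not the equivalence measures or training procedure of the paper; the sketch only shows how tuning p and the per-attribute weights changes a node's activation.

```python
import numpy as np

def equivalence(x, a, p, attr_w):
    # Weighted average of the componentwise equivalences 1 - |x_j - a_j|^p;
    # larger p is more forgiving of small mismatches, attr_w rescales attributes
    e = 1.0 - np.abs(x - a) ** p
    return float(np.average(e, weights=attr_w))

# Fundamental memories (one hidden node each) and tunable parameters
memories = np.array([[0.1, 0.9, 0.5],
                     [0.8, 0.2, 0.4]])
p = 2.0
attr_w = np.array([1.0, 1.0, 0.5])   # third attribute contributes less

x = np.array([0.15, 0.85, 0.9])
activations = [equivalence(x, a, p, attr_w) for a in memories]
winner = int(np.argmax(activations))
print(activations, winner)
```

Down-weighting the third attribute lets the first memory win despite its large mismatch in that component, which is the kind of attribute-level tuning the abstract describes.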

    Theta-fuzzy associative memories (theta-fams)

    No full text
    Funding: FAPESP - Fundação de Amparo à Pesquisa do Estado de São Paulo (grants 2009/16284-2; 2011/10014-).
    Most fuzzy associative memories (FAMs) in the literature correspond to neural networks with a single layer of weights that distributively contains the information on the associations to be stored. The main applications of these types of associative memory can be found in fuzzy rule-based systems. In contrast, Θ-fuzzy associative memories (Θ-FAMs) represent parametrized fuzzy neural networks with a hidden layer; these FAM models extend (dual) S-FAMs and SM-FAMs based on fuzzy subsethood and similarity measures. In this paper, we provide theoretical results concerning the storage capacity and error correction capability of Θ-FAMs. In addition, we introduce a training algorithm for Θ-FAMs, and we compare the error rates produced by Θ-FAMs with those of some well-known classifiers on several benchmark classification problems available on the internet. Finally, we apply Θ-FAMs to a problem of vision-based self-localization in mobile robotics.

    Calculus for linearly correlated fuzzy function using Fréchet derivative and Riemann integral

    No full text
    In this manuscript, we study integration and differentiation theories for interactive fuzzy processes. These theories are based on the Fréchet derivative and the Riemann integral. In addition, we present a connection between the two theories, i.e., some problems may be formulated in both ways. We establish the fundamental theorem of calculus, a theorem on the existence and local uniqueness of solutions of fuzzy differential equations, and some techniques for solving fuzzy initial value problems. To illustrate the usefulness of the developed theory, we investigate the radioactive decay model.
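The decay example can be sketched qualitatively. The toy below is not the paper's construction: it assumes a linearly correlated fuzzy process of the form x(t) = q(t)·A + r(t), for which the decay equation x' = -λx forces q and r to decay exponentially, so the fuzziness (spread) of the solution shrinks over time instead of growing as it would under non-interactive interval arithmetic. A is represented only by its support interval, and λ, q0, r0 are assumed values.

```python
import math

lam = 0.3                      # decay rate (assumed value)
A = (-1.0, 1.0)                # support of the symmetric fuzzy number A
q0, r0 = 0.2, 5.0              # initial condition x(0) = 0.2*A + 5.0

def x_interval(t):
    # Endpoints of x(t) = q(t)*A + r(t), with q and r decaying exponentially
    q = q0 * math.exp(-lam * t)
    r = r0 * math.exp(-lam * t)
    return (q * A[0] + r, q * A[1] + r)

def width(iv):
    return iv[1] - iv[0]

print(x_interval(0.0), x_interval(5.0))
```

The shrinking width is the qualitative signature of interactive (linearly correlated) fuzzy calculus on this model.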